Statistical Learning : CVloo stability is sufficient for generalization and necessary and sufficient for consistency of Empirical Risk Minimization

Authors

  • Sayan Mukherjee
  • Partha Niyogi
  • Tomaso Poggio
  • Ryan Rifkin
Abstract

Solutions of learning problems by Empirical Risk Minimization (ERM) – and almost-ERM when the minimizer does not exist – need to be consistent, so that they may be predictive. They also need to be well-posed in the sense of being stable, so that they can be used robustly. We propose a statistical form of stability, defined in terms of the property of cross-validation leave-one-out (CVloo) stability, which is sufficient, in general, for generalization, that is, convergence in probability of the empirical error to the expected error. Our second observation is that for bounded loss classes, CVloo stability of ERM is necessary and sufficient for consistency of ERM. We conclude that CVloo stability is the weakest form of stability that is sufficient for generalization for general learning algorithms while being necessary and sufficient for consistency of ERM. We discuss stronger forms of stability and their relations with "small" uGC hypothesis spaces, such as VC-classes and balls in a Sobolev or an RKHS space.

This report describes research done within the Center for Biological & Computational Learning in the Department of Brain & Cognitive Sciences and in the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. This research was sponsored by grants from: Office of Naval Research (DARPA) under contract No. N00014-00-1-0907, National Science Foundation (ITR) under contract No. IIS-0085836, National Science Foundation (KDI) under contract No. DMS-9872936, and National Science Foundation under contract No. IIS-9800032. Additional support was provided by: Central Research Institute of Electric Power Industry, Center for e-Business (MIT), Eastman Kodak Company, DaimlerChrysler AG, Compaq, Honda R&D Co., Ltd., Komatsu Ltd., Merrill-Lynch, NEC Fund, Nippon Telegraph & Telephone, Siemens Corporate Research, Inc., The Whitaker Foundation, and the Sloan Foundations.
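To make the CVloo stability notion concrete, here is a minimal numerical sketch, not from the paper itself: it measures, for a toy ERM problem (constant predictors under squared loss, whose empirical minimizer is the sample mean), how much the loss at a training point changes when that point is removed from the training set. The function names and the Gaussian data are illustrative assumptions; CVloo stability of ERM corresponds to these deviations vanishing in probability as the sample size grows.

```python
import random

def erm_constant(sample):
    """ERM over constant hypotheses with squared loss: the minimizer is the sample mean."""
    return sum(sample) / len(sample)

def cvloo_deviations(sample):
    """For each i, |V(f_S, z_i) - V(f_{S^i}, z_i)|: the change in loss at z_i
    between the full-sample minimizer f_S and the leave-one-out minimizer f_{S^i}."""
    f_full = erm_constant(sample)
    devs = []
    for i, z in enumerate(sample):
        loo = sample[:i] + sample[i + 1:]          # training set with z_i removed
        f_loo = erm_constant(loo)
        devs.append(abs((f_full - z) ** 2 - (f_loo - z) ** 2))
    return devs

if __name__ == "__main__":
    random.seed(0)
    for n in (10, 100, 1000):
        data = [random.gauss(0.0, 1.0) for _ in range(n)]
        # The worst-case leave-one-out deviation shrinks as n grows,
        # illustrating the stability property for this simple ERM.
        print(n, max(cvloo_deviations(data)))
```

For this toy class the deviation at z_i works out to (mean − z_i)² · ((n/(n−1))² − 1), which is O(1/n), so the printed maxima decrease with n; the paper's results concern this kind of behavior for general bounded loss classes.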


Similar articles

Online Learning, Stability, and Stochastic Gradient Descent (September 9, 2011)

In batch learning, stability together with existence and uniqueness of the solution corresponds to well-posedness of Empirical Risk Minimization (ERM) methods; recently, it was proved that CVloo stability is necessary and sufficient for generalization and consistency of ERM ([9]). In this note, we introduce CVon stability, which plays a similar role in online learning. We show that stochastic g...


Online Learning, Stability, and Stochastic Gradient Descent (May 26, 2011)

In batch learning, stability together with existence and uniqueness of the solution corresponds to well-posedness of Empirical Risk Minimization (ERM) methods; recently, it was proved that CVloo stability is necessary and sufficient for generalization and consistency of ERM ([2]). In this note, we introduce CVon stability, which plays a similar role in online learning. We show that stochastic g...


Statistical Learning : stability is sufficient for generalization and necessary and sufficient for consistency of Empirical Risk Minimization

Solutions of learning problems by Empirical Risk Minimization (ERM) – and almost-ERM when the minimizer does not exist – need to be consistent, so that they may be predictive. They also need to be well-posed in the sense of being stable, so that they might be used robustly. We propose a statistical form of leave-one-out stability, called CVEEEloo stability. Our main new results are two. We prov...


Statistical Learning : CVEEEloo stability is sufficient for generalization and necessary and sufficient for consistency of Empirical Risk Minimization

Solutions of learning problems by Empirical Risk Minimization (ERM) – and almost-ERM when the minimizer does not exist – need to be consistent, so that they may be predictive. They also need to be well-posed in the sense of being stable, so that they might be used robustly. We propose a statistical form of leave-one-out stability, called CVEEEloo stability. We prove that for bounded loss classe...


Statistical Learning : LOO stability is sufficient for generalization and necessary and sufficient for consistency of Empirical Risk Minimization

Solutions of learning problems by Empirical Risk Minimization (ERM) – and almost-ERM when the minimizer does not exist – need to be consistent, so that they may be predictive. They also need to be well-posed in the sense of being stable, so that they might be used robustly. We propose a statistical form of stability, defined as leave-one-out (LOO) stability. We prove that for bounded loss class...


Journal:

Volume   Issue 

Pages  -

Publication date: 2003